In statistics and signal processing, the '''orthogonality principle''' is a necessary and sufficient condition for the optimality of a Bayesian estimator. Loosely stated, the orthogonality principle says that the error vector of the optimal estimator (in a mean square error sense) is orthogonal to any possible estimator. The orthogonality principle is most commonly stated for linear estimators, but more general formulations are possible. Since the principle is a necessary and sufficient condition for optimality, it can be used to find the minimum mean square error estimator.

== Orthogonality principle for linear estimators ==

The orthogonality principle is most commonly used in the setting of linear estimation.〔Kay, p.386〕 In this context, let ''x'' be an unknown random vector which is to be estimated based on the observation vector ''y''. One wishes to construct a linear estimator <math>\hat{x} = Hy + c</math> for some matrix ''H'' and vector ''c''. Then, the orthogonality principle states that an estimator <math>\hat{x}</math> achieves minimum mean square error if and only if
* <math>\operatorname{E} \{ (\hat{x} - x) y^T \} = 0</math>, and
* <math>\operatorname{E} \{ \hat{x} - x \} = 0</math>.
If ''x'' and ''y'' have zero mean, then it suffices to require the first condition.
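The following is a minimal numerical sketch of these two conditions, under assumptions not taken from the article: the synthetic data model, the dimensions, and the sample-moment construction <math>H = C_{xy} C_{yy}^{-1}</math>, <math>c = \operatorname{E}[x] - H \operatorname{E}[y]</math> are chosen purely for illustration.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Synthetic jointly distributed data (an illustrative assumption):
# y is a noisy linear observation of the unknown random vector x.
n = 100_000
x = rng.normal(loc=1.0, scale=2.0, size=(n, 2))       # samples of x (nonzero mean)
A = np.array([[1.0, 0.5],
              [0.2, 1.0],
              [0.3, -0.4]])                           # hypothetical observation matrix
y = x @ A.T + rng.normal(scale=0.5, size=(n, 3))      # samples of y

# Linear MMSE estimator xhat = H y + c built from sample moments:
#   H = C_xy C_yy^{-1},   c = E[x] - H E[y]
xc = x - x.mean(axis=0)
yc = y - y.mean(axis=0)
C_xy = xc.T @ yc / n
C_yy = yc.T @ yc / n
H = C_xy @ np.linalg.inv(C_yy)
c = x.mean(axis=0) - H @ y.mean(axis=0)
xhat = y @ H.T + c

# Orthogonality principle: the error xhat - x has zero mean and is
# orthogonal to the observations, i.e. E{(xhat - x) y^T} = 0.
err = xhat - x
print(np.abs(err.T @ y / n).max())     # ~0 : E{(xhat - x) y^T} = 0
print(np.abs(err.mean(axis=0)).max())  # ~0 : E{xhat - x} = 0
</syntaxhighlight>

Both printed values are zero up to floating-point error: choosing ''c'' to match the means enforces the second condition exactly in the empirical distribution, and choosing <math>H = C_{xy} C_{yy}^{-1}</math> makes the sample cross-covariance of the error with ''y'' vanish, which is the first condition.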